Systems and methods for the control of target individuals using remotely piloted aircraft.
Patent abstract:
In this document, systems and methods for the control of target individuals using remotely piloted aircraft (UAVs) are disclosed, comprising the deployment of one or more UAVs (101) at the location of an individual. The method obtains information about the individual from the one or more UAVs (101) and from external sources. The method assigns an aggression factor to the individual based on the information obtained. Following the determination that the aggression factor is higher than an aggression threshold, the method devises a neutralization action aimed at reducing the individual's aggression factor: identifying one or more exception conditions of the individual and selecting, from among a plurality of neutralization actions that the one or more UAVs (101) are capable of performing, a neutralization action based on the one or more exception conditions. The method then instructs the one or more UAVs (101) to perform the selected neutralization action on the hostile individual.
Publication number: CH717275A2
Application number: CH00464/20
Filing date: 2020-04-20
Publication date: 2021-09-30
Inventors: Beloussov Serguei; Oleg Melnikov
Applicant: Acronis Int Gmbh
IPC main class:
Patent description:
CROSS-REFERENCE TO RELATED APPLICATION The present application claims the benefit of priority of United States patent application No. 16/833,538, filed on March 28, 2020, which is incorporated herein by reference.
TECHNICAL FIELD The present disclosure relates to the security sector and, more specifically, to systems and methods for using remotely piloted aircraft (UAVs) for safety and crowd monitoring.
STATE OF THE ART Hostile crowds can be very difficult to control. Protesters and activists traditionally display an aggressive attitude during rallies and protests. However, the suppression of any form of aggression has sometimes led to harmful actions by the police. For example, officers can use pepper spray and electric weapons (e.g., tasers). While these measures are not intended to fatally injure a target, they can cause serious injury if not used correctly. In the event of a sudden confrontation, a law enforcement officer may be unable to decide how to react, causing harm to the hostile individual(s), innocent bystanders or himself/herself. For example, an officer may use pepper spray on an individual who is allergic to the spray, or may use an electric weapon on an individual who has a medical sensitivity to electrical currents (for example, the individual may have a pacemaker). To overcome these problems, remotely piloted aircraft (UAVs) such as drones can be used for security purposes such as theft prevention, crowd monitoring, etc.
SUMMARY The present disclosure describes systems and methods for controlling hostile individuals in a harmless manner using remotely piloted aircraft (UAVs) and a system for operating and controlling a fleet of such UAVs. In an exemplary aspect, a method for controlling hostile individuals using remotely piloted aircraft (UAVs) includes deploying one or more UAVs at the location where an individual is present. The method obtains information about the individual from the one or more UAVs and from external sources. 
External sources may include law enforcement, databases, news sources, etc. Accordingly, the information obtained can be any combination of motion data, images, depth data, criminal identities, information on events (e.g., parades and concerts), etc. The method assigns an aggression factor to the individual based on the information obtained. Following the determination that the aggression factor is higher than an aggression threshold, the method devises a neutralization action aimed at reducing the individual's aggression factor: identifying one or more exception conditions of the individual and selecting, from among a plurality of neutralization actions that the one or more UAVs are capable of performing, a neutralization action based on the one or more exception conditions. The method then instructs the one or more UAVs to perform the selected neutralization action on the hostile individual. In some aspects, the method further acquires an image of the individual's face and determines an identifier of the individual by performing facial recognition on the facial image. In some aspects, identifying the one or more exception conditions of the individual includes generating a VIP whitelist database from external sources, in which neutralization actions cannot be performed on the VIPs, and determining that the identifier is in the whitelist database, where the one or more exception conditions indicate that the individual is a VIP. In some aspects, identifying the one or more exception conditions of the individual includes generating a blacklist database of dangerous individuals from external sources, and determining that the identifier is in the blacklist database, in which case the one or more exception conditions indicate that the individual is a dangerous individual. 
In some aspects, identifying the one or more exception conditions of the individual includes searching, in a medical database, the medical records of the individual using the identifier, and identifying one or more exception conditions in the individual's medical records. In some aspects, identifying the one or more exception conditions of the individual includes acquiring an image of the individual, identifying, through object recognition, the physical attributes of the individual, and predicting the exception condition on the basis of those physical attributes. In some aspects, the individual is part of a plurality of hostile individuals present at the location. The method also detects at least one other individual in the plurality of hostile individuals and determines that an aggression factor of the at least one other individual is above the aggression threshold. Following this, the method identifies another exception condition of the at least one other individual and selects, from the plurality of neutralization actions, another neutralization action for the at least one other individual on the basis of the other exception condition. The method then instructs the one or more UAVs to perform the other neutralization action at the same time as the neutralization action, where the neutralization action is different from the other neutralization action. In some aspects, the neutralization action comprises spraying a first substance on the individual using a first spray gun of the UAV, and the other neutralization action comprises spraying a second substance on the at least one other individual using a second spray gun of the UAV, in which the second substance is different from the first. 
In some aspects, the neutralization action includes spraying a substance on the individual, further including determining, via a depth sensor, the position of the individual and the distance between the individual and the UAV. The method then calculates a projection vector indicating a direction in which to spray the substance, determines a concentration of the substance that is safe for the individual based on the exception condition, adjusts the distance between the individual and the UAV to achieve that concentration, and sprays the substance along the projection vector so that the individual is exposed to the determined concentration. In some aspects, the calculation of the projection vector also includes determining a predicted position of the individual, based on the information obtained about the individual, where the information obtained includes motion data, and setting the projection vector along the UAV position and the predicted position. In some aspects, the exception condition includes at least one of the following: age, disability, allergies, pathologies, pregnancy, dependence on medical equipment, VIP status, criminal record and possession of a weapon. In some aspects, after performing the neutralization action, the method further determines whether the individual's aggression factor has fallen below the aggression threshold based on the movement data over a period of time. Following the determination that the aggression factor has not dropped below the aggression threshold, the method performs a secondary neutralization action from the plurality of neutralization actions. [0017] In some aspects, the neutralization action is one of: spraying a substance on the individual, making a sound, shining lights on the individual, sending an image of the individual to law enforcement and calling a law enforcement officer. It should be noted that the methods described above can be implemented in a system comprising a hardware processor. 
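The spraying geometry described above can be sketched in a few lines of Python. This is an illustrative model, not the patent's implementation: the inverse-square falloff assumption and both function names are mine.

```python
import math

def projection_vector(uav_pos, target_pos):
    """Unit vector pointing from the UAV toward the target's (predicted) position."""
    delta = [t - u for t, u in zip(target_pos, uav_pos)]
    norm = math.sqrt(sum(c * c for c in delta))
    return [c / norm for c in delta]

def standoff_distance(source_concentration, safe_concentration):
    """Distance at which the sprayed concentration decays to the safe level,
    assuming an inverse-square falloff (an illustrative model, not from the text)."""
    return math.sqrt(source_concentration / safe_concentration)
```

For example, a UAV hovering at (0, 0, 10) and spraying toward a target at (3, 4, 10) would spray along [0.6, 0.8, 0.0]; with a hypothetical source concentration of 100 units and a safe limit of 4 units, the UAV would hold a distance of 5 units.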
Alternatively, the methods can be implemented using computer-executable instructions stored on a non-transitory computer-readable medium. The simplified summary of the exemplary aspects above serves to enable a basic understanding of the present disclosure. This summary is not an exhaustive overview of all contemplated aspects and is neither intended to identify key or critical elements of all aspects, nor to delineate the scope of any or all aspects of this disclosure. Its sole purpose is to present one or more aspects in simplified form as a prelude to the more detailed description of the disclosure that follows. To accomplish the foregoing, one or more aspects of the present disclosure include the features described and exemplified in the claims.
BRIEF DESCRIPTION OF THE DRAWINGS The accompanying drawings, which are incorporated in and form part of this specification, illustrate one or more exemplary aspects of the present disclosure and, together with the detailed description, serve to explain their principles and implementations. [0021] FIG. 1A shows a situation for harmlessly controlling a hostile individual using a UAV, in accordance with aspects of this disclosure. [0022] FIG. 1B shows a situation for deploying a plurality of UAVs, in accordance with aspects of the present disclosure. [0023] FIG. 2A is a block diagram showing the components of a UAV, in accordance with aspects of this disclosure. [0024] FIG. 2B is a block diagram showing the components of a UAV communicating with a control center, in accordance with aspects of the present disclosure. [0025] FIG. 3 shows a situation for controlling a plurality of hostile individuals with different defense mechanisms, in accordance with aspects of the present disclosure. [0026] FIG. 4 shows a situation for controlling a plurality of hostile individuals on the basis of zones, in accordance with aspects of the present disclosure. [0027] FIG. 
5 shows a flowchart of a method for harmlessly controlling a hostile individual using a UAV, in accordance with aspects of this disclosure. [0028] FIG. 6 shows a flowchart of a method for controlling a plurality of hostile individuals, in accordance with aspects of the present disclosure. [0029] FIG. 7 shows a flowchart of a method for deploying a plurality of UAVs, in accordance with aspects of the present disclosure. [0030] FIG. 8 presents an example of a general-purpose computer system on which aspects of the present disclosure can be implemented.
DETAILED DESCRIPTION Exemplary aspects are described here in the context of a computer system, method and program product for controlling hostile individuals using remotely piloted aircraft (UAVs) and a system for operating and controlling a fleet of such UAVs. Those of ordinary skill in the art will realize that the following description is purely illustrative and is not intended to be limiting in any way. Other aspects will readily suggest themselves to those skilled in the art having the benefit of this disclosure. Reference will now be made in detail to implementations of the exemplary aspects as illustrated in the accompanying drawings. The same reference indicators will be used, as far as possible, throughout the drawings and the following description to refer to the same or similar elements. [0032] FIG. 1A shows a situation 100 for harmlessly controlling a hostile individual using a UAV, in accordance with aspects of this disclosure. Situation 100 includes UAV 101 and target 116. Target 116 may be a hostile individual. Note that although the examples provided in this disclosure primarily concern groups of people in confrontations, the systems and methods are also applicable in the presence of a single hostile individual. For example, target 116 may be a thief attempting to break into a building. UAV 101 can serve as a security system that detects target 116 and prevents the theft. 
The UAV 101 includes various hardware components, including the propeller 102 and the motor 104, which together enable the flight capabilities of the UAV 101. The UAV 101 can be a drone, a helicopter, an aircraft, a jet or any device capable of flight. In some aspects, the movements and altitude of the UAV 101 are controlled remotely by a user (for example, using a remote control) or by a remote control center. In some aspects, movements and altitude are automated (for example, using flight-based artificial intelligence (A.I.)). The UAV 101 further includes the camera 108, the cartridge 110 containing the substance 114, and the spray gun 112 for spraying the substance 114. FIG. 1 shows two cameras, three spray guns and one cartridge; however, this depiction is for simplicity only. In some aspects, there may be more or fewer cameras, spray guns and cartridges, respectively (as long as there is at least one of each device). The positioning of the camera 108 can be such as to allow a 360-degree view of the environment in which the UAV 101 flies. This can be achieved with a single 360-degree camera, two cameras of at least 180 degrees each, three cameras of at least 120 degrees each, etc. Camera 108 may include a built-in microphone for recording ambient audio. Similarly, the spray gun 112 may comprise several spray guns, each connected to a different portion of the cartridge 110 (which, in some aspects, may contain a different substance). The spray guns can collectively cover 360 degrees of spray area. In some aspects, the spray gun 112 can be rotated and repositioned automatically or by remote control. Substance 114 represents one approach to neutralizing target 116. Substance 114 can be a liquid (e.g., water), an aerosol (e.g., pepper spray) or a gas (e.g., one with a pungent odor). The purpose of substance 114 is to dissuade target 116 from joining other hostile individuals and/or to induce target 116 to leave the environment. 
In some aspects, the UAV 101 may comprise loudspeakers/lights 106. The loudspeaker can be used to produce sounds, including verbal commands such as "Stop!" and alarms such as a high-frequency siren. The light source can be used to illuminate the faces of individuals in dark environments (for example, during night operation). The light source can also be used to temporarily emit a high-intensity light to dissuade the target 116 from making aggressive movements. For example, if a high-intensity light is aimed at the target 116, the target 116 can be induced to stop moving and to shield itself from the light with its hands. [0038] FIG. 1B shows a situation 118 for deploying a plurality of UAVs, in accordance with aspects of the present disclosure. Situation 118 comprises a UAV base 120 which may house a plurality of UAVs 101. The UAV base 120 may be a fixed warehouse which also includes the UAV control center 122. The control center 122 comprises one or more servers configured to deploy one or more UAVs 101 at an individual's location (for example, where a confrontation is taking place). The control center 122 servers can track the positions of each UAV 101 and communicate wirelessly with each UAV 101 to execute a variety of processing commands. In some aspects, a subgroup of UAVs can be stored on a mobile base 124 which carries the subgroup of UAVs to the location where the individual is (or nearby). The mobile base 124 may also comprise servers that can communicate both with the UAV subgroup and with the control center 122 to relay information between the two entities. In some aspects, the servers present on the mobile base 124 can create an ad hoc network with the plurality of UAVs, without the need to communicate with the control center 122. In some aspects, the control center 122 may receive global positioning system (GPS) data from each UAV of the plurality of UAVs to determine their current positions. 
The control center 122 can then transmit route data to guide the UAVs to the location. In some aspects, the control center 122 can designate one UAV from the plurality of UAVs as the manager UAV. The manager UAV can be configured to receive GPS data from each of the other UAVs of the plurality of UAVs and transmit them to the control center 122. The manager UAV can then receive instructions from the control center 122 and relay the instructions to the other UAVs. Each UAV of the plurality of UAVs can be assigned an identifier. Therefore, the control center 122 / manager UAV can transmit messages with a header indicating the UAV identifier to give instructions. The control center 122 can monitor external sources such as the press and law enforcement reports to detect events that require neutralization by one or more UAVs. For example, the control center 122 can detect a confrontation at a particular location. The control center 122 may refer to predetermined deployment schemes to select the number of UAVs to be deployed. The predetermined deployment schemes can be based on the number of people potentially to be neutralized (for example, the number of participants in the confrontation), on the meteorological conditions (for example, whether there is bad weather), on the number of law enforcement officers on site, on the distance to the location, and on the number of UAVs currently available for deployment at the UAV base 120 and/or the mobile base 124. For example, a scheme may indicate the dispatch of five UAVs if there are at most 10 people to be neutralized, the weather forecast predicts rain, there are at most three police officers on site, the distance is at most 20 miles and at least ten UAVs are available for deployment. The control center 122 can use, for example, a binary decision tree to arrive at one deployment scheme of the plurality of deployment schemes. 
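A minimal sketch of how such a predetermined scheme might be encoded, using the worked example's thresholds (at most 10 people, rain, at most three officers on site, at most 20 miles, at least ten UAVs available, yielding five UAVs). The fallback rule is purely illustrative and not from the text:

```python
def uavs_to_deploy(people, bad_weather, officers_on_site, distance_miles, available):
    """Pick a deployment size from predetermined schemes.

    The first branch mirrors the worked example in the text; the fallback
    (one UAV per two people, capped by availability) is an assumption.
    """
    if (people <= 10 and bad_weather and officers_on_site <= 3
            and distance_miles <= 20 and available >= 10):
        return 5
    # Illustrative fallback: one UAV per two people, at least one, capped by stock.
    return min(available, max(1, people // 2))
```

A binary decision tree, as the text suggests, would generalize this single hand-written branch to many such threshold tests.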
In some aspects, there may be additional criteria for determining the number of UAVs to be deployed, such as whether the location is in a rural or urban setting and how many UAVs have typically been deployed at similar events (and whether the event attendees were successfully neutralized by the UAVs or not). In some aspects, the control center 122 may present an interface to the operators 126, who are users of the control center 122. The interface can display all instructions that the control center 122 recommends be carried out by each individual UAV. Upon receipt of confirmations from the operators 126 through the interface, the control center 122 sends the instructions to the UAVs. This adds an extra layer of verification to ensure that a UAV does not execute a careless instruction. [0042] FIG. 2A is a block diagram showing the components of the UAV 200, in accordance with aspects of this disclosure. The UAV 200 is the same as the UAV 101 and is divided into two parts: the hardware 201 and the software 202. The components of the hardware 201 correspond to the components described for the UAV 101: camera 204 corresponds to camera 108, propeller 210 to propeller 102, motor 208 to motor 104, spray gun 214 to spray gun 112, cartridge 212 to cartridge 110, and speaker/light 216 to speaker/light 106. The UAV 200 also includes the depth sensor 206. In some aspects, the depth sensor 206 is part of the camera 204. In some aspects, the depth sensor 206 is a separate hardware component of the UAV 200. [0043] One goal of the UAV 200 is to neutralize a hostile individual in a harmless way. To do this, the specific individual must be identified. Via the camera 204 and the depth sensor 206, the UAV 200 detects an individual in an environment. In particular, the acquired images are fed into the facial recognition module 220 and the object detector 228. 
If the facial recognition module 220 and/or the object detector 228 identify a face and/or a human being, the UAV 200 can confirm that an individual is present in the environment (for example, target 116). The UAV 200 can then assign an aggression factor to the individual based on the individual's movement data. For example, the object detector 228 can be configured to monitor a number of movements of the target 116. The object detector 228 can identify the user in a first frame captured by the camera 204 and determine the number of pixels that change over a period of time. In some aspects, the object detector 228 can generate a visual box around the target 116 in a particular frame and evaluate the number of pixels that change over a period of time within the target area. In order to eliminate anomalies (for example, changes in the background), the object detector 228 can use computer vision techniques such as edge detection to generate an outline around the target 116 (the outline conforming to the body of the target 116). All other pixels within the visual box that fall outside the outline can be set to a constant value. In addition, the object detector 228 can lock onto a specific part of the body of the target 116. For example, the object detector 228 can set the nose of the target 116 as the center point of the visual box, such that the edge of the visual box is relative to a single point. Thus, a pixel change indicates that the user's body is moving. In principle, the more the target 116's body moves (for example, the arms may swing and the legs may kick), the more aggressive the user is classified as. Given these image adjustments and anchoring, the aggression factor can be made proportional to the number of pixels that change in the visual box. This means that if 70% of the pixels change, the aggression factor is 70%. The aggression factor can be compared with a predetermined aggression threshold (for example, 50%). 
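The pixel-change computation just described might look as follows. The frames, the outline mask and the noise threshold are illustrative stand-ins for the output of camera 204 and the edge-detection step:

```python
import numpy as np

def aggression_factor(frame_a, frame_b, outline_mask, noise_threshold=10):
    """Fraction of pixels inside the target's outline whose intensity changed
    between two frames. Pixels outside the outline are ignored, mirroring the
    constant-value masking step; `noise_threshold` is an illustrative guard
    against sensor noise."""
    # Widen the dtype first so uint8 subtraction cannot wrap around.
    diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
    changed = (diff > noise_threshold) & outline_mask
    return changed.sum() / outline_mask.sum()
```

With 70% of the masked pixels changing, the factor is 0.7, which exceeds the example threshold of 0.5 and would trigger a neutralization action.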
Following the determination that the aggression factor is above an aggression threshold, the UAV 200 devises a neutralization action aimed at reducing the individual's aggression factor. [0045] Note that, in some aspects, the UAV 200 does not rely solely on the comparison between the aggression factor and an aggression threshold to decide whether to prepare a neutralization action. For example, the object detector 228 can analyze the gestures performed by a user to determine whether a neutralization action should be taken. Gestures that trigger a neutralization action by the UAV 200 include, but are not limited to, a throwing motion (for example, if target 116 attempts to throw an object at the UAV 200), an attack attempt (for example, if the target 116 throws a punch or a kick) and a swinging motion. In some aspects, the presence of a weapon (for example, a pistol, a knife, a stick, etc.) can be identified by the object detector 228. Upon detecting a weapon in the individual's possession, the UAV 200 can prepare a neutralization action. [0046] In order to detect gestures and weapons, the object detector 228 can use the machine learning module 222, which can comprise a plurality of machine learning algorithms. One algorithm can be used to identify gestures, where the object detector 228 identifies an individual and provides the motion data associated with the individual to the input parser 224 along with the request to identify aggressive gestures. The gesture detection algorithm can be trained on a plurality of data structures comprising motion data (e.g., three-dimensional position coordinates of key body parts such as hands, arms, legs, feet and head, identified by the object detector 228, together with associated time stamps over a period of time) and classified gestures. Some of these gestures can be aggressive (for example, throwing a punch), while other gestures can be non-aggressive (for example, shaking hands). 
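Since a Bayesian classifier is named as one option, a minimal Gaussian naive Bayes over motion features can illustrate the flow from parsed motion data to an aggressive/non-aggressive label. The feature choice (e.g., peak hand speed, limb extension) and the training samples are synthetic assumptions, not the patent's trained model:

```python
import numpy as np

class GaussianNB:
    """Minimal Gaussian naive Bayes, standing in for classifier 226.
    Inputs would be feature vectors produced by the input parser from
    raw motion data."""

    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = np.array([X[y == c].mean(axis=0) for c in self.classes])
        # Small floor on the variance avoids division by zero.
        self.var = np.array([X[y == c].var(axis=0) + 1e-6 for c in self.classes])
        self.prior = np.array([(y == c).mean() for c in self.classes])
        return self

    def predict(self, X):
        # Log-likelihood of each sample under each class, plus the log-prior.
        ll = -0.5 * (((X[:, None, :] - self.mu) ** 2) / self.var
                     + np.log(2 * np.pi * self.var)).sum(axis=2)
        return self.classes[np.argmax(ll + np.log(self.prior), axis=1)]
```

Trained on a few synthetic (peak hand speed, limb extension) samples, the classifier labels fast, wide motions as aggressive and slow ones as calm.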
The input parser 224 can take the raw data from the object detector 228 and generate a data structure compatible with the gesture algorithm. Classifier 226, which embodies the actual algorithms, stores the trained weights. Classifier 226 receives the input data parsed by the input parser 224, applies its weights according to a classification algorithm (for example, a Bayesian classifier) and indicates whether a gesture is aggressive or non-aggressive. [0047] Following the determination that an individual is aggressive, the UAV 200 prepares a neutralization action. A neutralization action may include, but is not limited to, (1) spraying a substance on the individual, (2) making a sound, (3) shining lights at the individual, (4) sending an image of the individual to law enforcement and (5) calling a law enforcement officer. [0048] One of the objectives of this action is to calm the user or induce him to leave the environment. Technically, this is achieved when the aggression factor is brought below the aggression threshold and/or when classifier 226 determines that the individual's most recent movements represent non-aggressive gestures. Another goal of the neutralization action is to ensure that the individual does not suffer any harm from the UAV 200. Target individuals may have various medical conditions that make them susceptible to harm. For example, the use of flashing lights can be harmful if the individual has photosensitivity, the emission of loud sounds can be harmful if the individual wears a very sensitive hearing aid, and the spraying of a particular substance on the individual may be harmful if the individual is allergic to the substance or prone to fainting when exposed to high concentrations of the substance. The long-term effects are also very important. For example, the elderly, minors and pregnant women are particularly sensitive to exposure to particular substances (e.g. 
pepper spray) and therefore should not be exposed to them, in order to prevent problems with growth, development and longevity. [0050] Apart from medical conditions, an individual classified as aggressive may in fact belong to a group of "very important people" (VIPs) who are not to be targeted. For example, the individual may be a law enforcement officer, politician, celebrity, etc. An individual may also be a person belonging to a group of dangerous individuals (for example, known criminals and terrorists) who need to be arrested. [0051] To achieve these objectives, the UAV 200 identifies an exception condition of the individual. An exception condition refers to any attribute of the individual that can serve as a deterrent to using a particular neutralization action, particularly because the neutralization action can be ineffective, harmful, or can cause a criminal to flee (thus evading arrest). The exception conditions can be any combination of age, disability, allergies, medical conditions, pregnancy, dependence on medical equipment, VIP status, criminal record, and possession of a weapon. In some aspects, the UAV 200 may use the facial recognition module 220 to determine a user identifier (e.g., name, ID number, etc.) by referencing a database that maps facial images to such identifiers. Having compared facial images and determined an identifier, the facial recognition module 220 can refer to the database 218, which includes public health records, criminal records and VIP whitelists, to determine at least one exception condition. The database 218 can be pre-generated by the control center 122, which collects information on a population from external sources such as hospitals, departments of motor vehicles, law enforcement and the press. For example, DMV records can provide information on age and medical conditions, such as the use of corrective devices for sight and hearing. 
Law enforcement can provide lists of people who need to be arrested, as well as names and photos of law enforcement members. The control center 122 can generate whitelists (for VIPs who are not to be targeted) and blacklists (for dangerous individuals) based on this information. To simplify lookups, the database 218 can be in the form of a data structure that answers "yes"/"no" questions for different fields. An example of this data structure is shown below:

Individual | Age | Vision problems | Hearing problems | Movement problems | Allergic to substance X | Allergic to substance Y | Blacklisted | Whitelisted | Gun owner | Pregnant
John Doe | 35 | No | No | No | Yes | No | No | No | No | No

Table 1: Database

[0053] For each individual, each field above containing a "yes" is considered an exception condition. In some aspects, age can be classified as an exception condition if it is under 18 or over 65 (these numbers can be changed). Note that the fields shown in the previous table are examples only. The table may include additional fields, such as the dosage limits of a particular substance that the individual can tolerate. In some aspects, the object detector 228 can be used to populate the database. For example, the object detector 228 can analyze an image of the individual to identify the physical attributes of the individual. Physical attributes include objects that can be used to predict the individual's exception condition. For example, the object detector 228 can identify objects that indicate motor problems (for example, crutches, wheelchairs, etc.), hearing problems (for example, a hearing aid) and vision problems (for example, sunglasses and a cane). Physical attributes also include age, height, weight and visual anomalies (e.g., a cast, a neck brace, missing limbs, etc.). In some aspects, the depth sensor 206 can be used to roughly estimate the user's height and weight (based on the volume calculated by the depth sensor). 
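A record shaped like Table 1 and its exception-condition lookup could be sketched as follows. The field names are illustrative; the under-18/over-65 age bounds follow the text:

```python
# Yes/no fields from Table 1; names are illustrative stand-ins.
RECORD_FIELDS = ("vision_problems", "hearing_problems", "movement_problems",
                 "allergic_substance_x", "allergic_substance_y",
                 "blacklisted", "whitelisted", "gun_owner", "pregnant")

def exception_conditions(record):
    """Collect exception conditions from a record shaped like Table 1.
    Any field answered "yes" counts; age counts if under 18 or over 65."""
    conditions = [field for field in RECORD_FIELDS if record.get(field)]
    age = record.get("age")
    if age is not None and (age < 18 or age > 65):
        conditions.append("age")
    return conditions

# Table 1's row for John Doe as a dictionary (absent fields default to "no").
john_doe = {"name": "John Doe", "age": 35, "allergic_substance_x": True}
```

For John Doe, the only exception condition is the allergy to substance X, so spraying that substance would be excluded while all other actions remain available.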
In addition, the camera 204 can capture a picture of the user's face and determine the age of the individual using an age approximation algorithm in the machine learning module 222. The input parser 224 can receive the height, the weight and the facial image of the individual, and the classifier 226 can estimate an age from these three data (for example, the algorithm can be trained on a plurality of pre-classified images, heights and weights of a population). [0055] Suppose that an individual has been identified by the facial recognition module 220, but the UAV 200 does not find a record for the individual in the database 218. Following the determination that no record exists, the object detector 228 and the machine learning module 222 can create a record for the individual in the database 218. The object detector 228 can, for example, detect an asthma inhaler in the hands of the individual. Based on height, weight and facial features, the age of the individual can be approximated as 25 years. The following record can then be generated:

Person X | 25 | No | No | No | Yes | No | No

Table 2: New entry in the database

[0056] Note that just as certain neutralization actions may be harmful to an individual, some neutralization actions may be ineffective. For example, if an individual's exception condition indicates that he is blind, the use of lights against the individual will not diminish the individual's aggression. Likewise, if the individual is completely deaf, making a loud sound will be ineffective. [0057] The UAV 200 can track, through the defense module 232, a plurality of neutralization actions that the UAV 200 is able to perform. For each action, the defense module 232 can store additional information about the action, such as the medical conditions against which the action is ineffective or harmful. Additional information may also include battery capacity and supply-level data. 
For example, the spray cartridge 212 can monitor how much spray has already been used and how much remains. Similarly, the speaker/light 216 and the spray gun 214 can be used sparingly based on the battery life of the UAV. In some aspects, a portion of the battery can be reserved for each of these actions. The defense module 232 can store this data in a data structure. Below is an example of such a data structure:

Action | Capacity | Harmful to | Ineffective against | Priority | Cost per use
Spray substance X | 70% | Allergy; sensitivity to airborne substances; seniors; minors | — | 1 | 2%
Spray substance Y | 10% | Seniors; minors | — | 2 | 2%
Light | 25% | Prone to photosensitivity disorders | Visual disturbances | 3 | 0.5%
Loudspeaker | 25% | Hearing sensitivity | Hearing problems | 4 | 0.1%
Call the authorities | 4/5 (signal) | — | Poor signal reception | 5 | —

Table 3: Neutralization actions

[0058] Table 3 is provided by way of example. One skilled in the art would appreciate that more or fewer actions may be available on any UAV. The UAV 200 selects, from the plurality of neutralization actions that the UAV is able to perform, a neutralization action that does not harm the individual, based on the exception condition, and that is effective against him. In some aspects, each action is assigned a priority value. When deciding which action to choose, the UAV 200 can select the action with the highest priority (i.e., 1, followed by 2, 3, 4, and 5). For example, the UAV 200 may first consider spraying substance X on the individual classified in Table 2. Person X suffers from asthma and is classified as sensitive to airborne substances according to Table 2. Therefore, the UAV will not select substance X (e.g., a gas with a pungent odor). The UAV 200 may then consider spraying substance Y on Person X. Since there is no indication of harmfulness and the capacity is sufficient, the UAV 200 can spray substance Y (e.g., water) on Person X. In some aspects, for individuals on the white list or black list, the UAV 200 can select a predetermined neutralization action.
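The priority-driven selection just described can be sketched as follows. This is a hedged reading of the Table 3 logic, not the claimed implementation: the action tuples, condition labels, and the minimum-capacity check are illustrative assumptions.

```python
# Hypothetical sketch of the defense module 232's action selection (Table 3 logic).
ACTIONS = [
    # (name, capacity %, harmful-to conditions, ineffective-against conditions, priority)
    ("spray_x", 70, {"allergy", "airborne_sensitivity", "senior", "minor"}, set(), 1),
    ("spray_y", 10, {"senior", "minor"}, set(), 2),
    ("light", 25, {"photosensitivity"}, {"vision_problems"}, 3),
    ("speaker", 25, {"hearing_sensitivity"}, {"hearing_problems"}, 4),
    ("call_authorities", 80, set(), set(), 5),  # 4/5 signal strength as a percentage
]


def select_action(exceptions: set, min_capacity: float = 5.0):
    """Pick the highest-priority action that is neither harmful nor ineffective."""
    for name, capacity, harmful, ineffective, _prio in sorted(ACTIONS, key=lambda a: a[4]):
        if capacity < min_capacity:
            continue  # not enough substance, battery, or signal left
        if harmful & exceptions or ineffective & exceptions:
            continue  # would harm the individual, or would have no effect
        return name
    return None  # fall back to a universal action (e.g., photographing the target)
```

For Person X of Table 2 (sensitive to airborne substances), substance X is skipped and substance Y is chosen, matching the walkthrough above.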
For example, for VIPs on a white list, the UAV 200 may not perform any neutralization action. Conversely, for dangerous individuals on a black list, the UAV 200 can contact law enforcement immediately. Regarding spraying, each cartridge 212 can start with 100% of each substance. As actions are executed, this capacity decreases at a reduction rate. In this particular example, selecting the spray option on an individual reduces the remaining capacity for that particular substance in the cartridge 212 by 2%. Similarly, flashing lights aimed at an individual consume 0.5% of the allocated battery, and sounding the speakers consumes 0.1% of the allocated battery. Regarding calling the authorities, the UAV 200 can access, using the network adapter 234, a network such as the Internet or a mobile network. The signal strength (or connection strength) of the network can be rated on a scale of one to five. In this particular example, the signal strength is 4 out of 5. The UAV 200 can compare this value with a signal strength threshold (e.g., 1 out of 5). If the signal strength is below the threshold, the UAV 200 may not be able to perform this action. For example, the UAV 200 may be in a place (for example, a forest) where it is not possible to send messages or place calls due to poor reception. In some aspects, after performing the neutralization action, the UAV 200 determines whether the individual's aggression factor has fallen below the aggression threshold based on movement data over a period of time. For example, the UAV 200 may spray target 116 (which may be Person X) with substance 114 (which may be substance X). After a period of time, for example 30 seconds, the UAV 200 can re-evaluate the aggression factor of target 116 using the methods discussed above. After determining that the aggression factor has not dropped below the aggression threshold, the UAV 200 can perform a second neutralization action from the plurality of neutralization actions.
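One hedged reading of this act-wait-remeasure escalation loop is sketched below; the helper functions standing in for the aggression estimator and the action executor are hypothetical, and the 30-second wait follows the example above.

```python
# Hypothetical escalation loop: act, wait, re-measure, escalate if needed.
import time


def neutralize(target, actions, measure_aggression, perform, threshold,
               wait_s=30, sleeper=time.sleep):
    """Try actions in order until the target's aggression factor drops below threshold.

    `actions` is assumed to be pre-sorted by priority and already filtered
    against the target's exception conditions.
    """
    for action in actions:
        perform(action, target)
        sleeper(wait_s)  # e.g., 30 seconds, as in the example above
        if measure_aggression(target) < threshold:
            return action  # neutralization succeeded
    return None  # no action reduced the aggression factor; defer to the control center
```

Injecting `sleeper` keeps the loop testable without real delays; in flight it would default to `time.sleep`.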
For example, if target 116 is still screaming and making offensive movements (e.g., kicks), the UAV 200 can select the next-highest-priority action that is effective against target 116 and does not harm it. According to Tables 3 and 2, the next action will be to aim flashing lights at target 116 (e.g., to disturb his vision). [0063] In some aspects, if the neutralization action involves spraying a substance on the individual, the UAV 200 determines, through the depth sensor 206, the position of the individual and the distance between the individual and the UAV 200. For example, the UAV 200 can fix the position of the depth sensor 206 as the point of origin (P1) in three-dimensional space (x1, y1, z1), where x1 = y1 = z1 = 0. The position of the individual (for example, the individual's nose) (P2) can be assigned a three-dimensional coordinate (x2, y2, z2) with respect to the point of origin. The distance between the origin point and (x2, y2, z2) can be determined using the distance formula:

d = √((x2 − x1)² + (y2 − y1)² + (z2 − z1)²)

The UAV 200 can also calculate a projection vector indicating the direction in which to spray the substance. The UAV 200 can determine a concentration of the substance that is safe for the individual based on the exception condition. For example, if the distance between P1 and P2 is less than a threshold distance (for example, 1 foot), the force with which a substance such as pepper spray is sprayed can be very harmful to the individual. This is because the concentration of the substance is too high. As the distance between P1 and P2 increases, the substance is diluted in the air and becomes less concentrated. In some aspects, the defense module 232 can determine a relationship between the concentration of a substance and the distance between P1 and P2. Based on this relationship, the UAV 200 can adjust the distance between the individual and the UAV 200 to achieve the desired concentration. In some aspects, the desired concentration can be recorded in the database 218 for a specific individual.
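The distance, projection-vector, and predicted-position calculations of this paragraph and the next reduce to elementary vector arithmetic. The sketch below assumes straight-line target motion and hypothetical function names; it is a worked illustration, not the claimed implementation.

```python
# Hypothetical sketch of the spraying geometry: P1 is the UAV/depth-sensor origin.
import math


def distance(p1, p2):
    """Euclidean distance between two 3-D points (the distance formula above)."""
    return math.sqrt(sum((b - a) ** 2 for a, b in zip(p1, p2)))


def projection_vector(p1, p2):
    """Unit vector from the UAV at p1 towards the (possibly predicted) target point."""
    d = distance(p1, p2)
    return tuple((b - a) / d for a, b in zip(p1, p2))


def predicted_position(p2, velocity, t):
    """Predict where a moving individual will be after t seconds (constant velocity)."""
    return tuple(c + v * t for c, v in zip(p2, velocity))
```

For a target moving towards P3, the UAV would aim along `projection_vector(P1, predicted_position(P2, velocity, 5))`, matching the P4 interception example in the text.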
In some aspects, the desired concentration is universal, so that a specific distance must be maintained for all individuals when the particular substance is sprayed. The motion module 230 receives the distance information and sends instructions to the motor 208 and the propeller 210 to produce the desired motion (e.g., left, right, up, down). As a result, the UAV 200 sprays the substance along the projection vector such that the individual is exposed to the desired concentration. In some aspects, the depth sensor 206 may acquire motion data which indicates, over a certain period of time, that the individual is moving (e.g., running or walking). In order to lock onto the individual when the projection vector is calculated, the UAV 200 determines a predicted location of the individual. For example, the individual may be moving at 3 miles per hour towards P3 in three-dimensional space. The UAV 200 can determine that in a predetermined time (e.g., 5 seconds), based on the individual's movements, the individual will be at P4. The UAV 200 then determines that the projection vector must lie along the UAV's position (P1) and the predicted position (P4). After 5 seconds, the UAV 200 sprays the substance at the desired concentration - maintaining a particular distance - towards P4. Based on the individual's movements, the individual should then be exposed to the substance. In some aspects, the object detector 228 can also identify the presence of obstacles in the path between the individual and the UAV 200. The positions of the obstacles are sent to the motion module 230, which determines a movement path towards the individual and, accordingly, sends instructions to the propeller 210 and the motor 208. [0068] FIG. 2B is a block diagram showing the components of the UAV 246 communicating with the UAV control center 238, in accordance with aspects of this disclosure. In some aspects, the modules shown in the software 202 of FIG.
2A can be thin-client applications that communicate with counterpart applications on one or more servers of the control center 238 (the same as the control center 122). For example, instead of performing image analysis and machine learning calculations aboard the UAV 200, the UAV 200 may comprise the network adapter 234, which can communicate with one or more servers over a network (e.g., the Internet) that perform the calculations and provide the results. [0069] In FIG. 2B, the UAV 246 represents a simplified version of the UAV 200 of FIG. 2A. In FIG. 2A, the UAV 200 performed all calculations and processing on board via the software 202. The software 202 of the UAV 246, however, moves modules such as the database(s) 218, the facial recognition module 220, and the machine learning module 222 to the software 244 of the UAV control center 238. The hardware 201 components can be shared by both the UAV 200 and the UAV 246. The hardware 240 of the control center 238 may include the network adapter 242, which is configured to transmit and receive network traffic, such as instructions and data. The module called control center communication 236 can be configured to receive and analyze instructions coming from the control center 238. The control center communication module 236 can also create messages with payloads containing raw data collected by the hardware 201 (to be transmitted to the control center 238). The module called UAV communication 248 can be configured to receive the messages and distribute them to the modules 218, 220, and 222 (for example, image data can be forwarded to the facial recognition module 220). The UAV communication module 248 can also be configured to receive information from the modules 218, 220, and 222 and generate instructions for transmission to the UAV 246. For example, an instruction may be to perform a certain neutralization action, to track the location of an individual (for example, a criminal), to move to a different location, etc.
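The disclosure does not specify the wire format of the messages exchanged by the communication modules 236 and 248. One plausible thin-client payload, with entirely hypothetical field names and JSON chosen only for illustration, might look like this:

```python
# Hypothetical message payload exchanged between the UAV 246 and the control center 238.
import json


def build_payload(uav_id, image_bytes, position, battery_pct):
    """Package raw sensor data for transmission by the control center communication 236."""
    message = {
        "uav_id": uav_id,
        "image_b64_len": len(image_bytes),  # stand-in for an encoded camera frame
        "position": {"x": position[0], "y": position[1], "z": position[2]},
        "battery_pct": battery_pct,
    }
    return json.dumps(message)


def parse_instruction(raw):
    """Decode an instruction generated by the UAV communication module 248."""
    return json.loads(raw)
```

On the control center side, the decoded image data would be routed to the facial recognition module 220, and instructions (e.g., "perform action", "track individual", "relocate") would travel back the same way.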
The object detector 228 can be used to avoid collisions with neighboring UAVs in a plurality of UAVs. The object detector 228 can relay collision detection information to the motion module 230, which performs the movements necessary to avoid a collision. For example, the object detector 228 can determine that an adjacent UAV is within a threshold distance (e.g., 3 feet) and that the UAV 246 must therefore move to another location. In some aspects, movement instructions can be received from the control center 238, which is configured to monitor the individual positions of all deployed UAVs and prevent collisions. [0072] FIG. 3 shows a situation 300 for controlling a plurality of hostile individuals with different defense mechanisms, in accordance with aspects of the present disclosure. So far the focus has been on a single individual (e.g., target 116). However, the systems and methods described in FIGS. 1 and 2 are applicable in situations involving a plurality of hostile individuals. In situation 300, in addition to target 116, there is target 304. In some aspects, the UAV 101 can detect at least one other individual in the plurality of hostile individuals (for example, target 304). The UAV 101 can determine (for example, using the camera 204, the facial recognition module 220, the object detector 228, and the machine learning module 222) that an aggression factor of the at least one other individual is also greater than the aggression threshold. Consequently, the UAV 101 can identify another exception condition of the at least one other individual. For example, the UAV 101 can consult the database 218 to identify the exception condition of target 304. The UAV 101 can then select, from the plurality of neutralization actions, another neutralization action that does not harm the at least one other individual, based on the other exception condition. Suppose the UAV 101 contains two substances in its cartridge: substance 114 (e.g., pepper spray) and substance 302 (e.g., water).
The UAV 101 can determine that one of its neutralization actions can be used on target 116 without harming target 116. However, for target 304, the UAV 101 can determine, from the database information, that target 304 is allergic to pepper spray. For this reason, the UAV 101 can choose to spray water on target 304. In some aspects, the UAV 101 may perform the other neutralization action simultaneously with the first neutralization action. For example, the UAV 101 can identify target 116 and target 304. The UAV 101 can then identify the exception condition of each target and select an appropriate neutralization action. After selection, the UAV 101 can perform both actions simultaneously. In some aspects, the UAV 101 can select a single target from the plurality of hostile individuals and perform the actions of target identification, aggression detection, determination of the exception condition, selection of an action, and execution of the action. After performing the action, the UAV 101 can direct its attention to a different target in the plurality of hostile individuals and perform the actions identified above for that target. In some aspects, the UAV 101 can evaluate the aggression factors of all individuals in the plurality of hostile individuals and rank each individual according to the aggression factors, to ensure that the most hostile individuals are adequately neutralized before moving on to the next target. In some aspects, the UAV 101 can pipeline the above actions for different targets such that, for example, while determining the exception condition for target 116, the UAV 101 may already be spraying substance 302 on target 304. This allows the UAV 101 to neutralize multiple targets effectively because, for some targets, the UAV 101 may take longer to determine an exception condition, or it may need to perform further movement to reach the target. [0078] FIG.
4 shows a situation 400 for controlling a plurality of hostile individuals on the basis of zones, in accordance with aspects of the present disclosure. In some situations, there may be many individuals in the plurality of hostile individuals. Thus, performing an action on each target individually can be inefficient. In situation 400, the UAV 101 captures image 401. Image 401 can be a real-time video image, a frame of a video, or a photo taken at a certain time. Using the object detector 228 and the facial recognition module 220, the UAV 101 can identify each individual 402, 404, 406, 408, and 410. For each individual, the UAV 101 can also identify an exception condition. The UAV 101 can then compare the exception conditions of the individuals to identify individuals with matching exception conditions. For example, the respective exception conditions determined by the UAV 101 may indicate that individual 406 is an elderly woman with hearing sensitivity, individuals 402 and 404 are children, and individuals 408 and 410 are healthy adults who have no exception conditions. Based on the exception conditions, the UAV 101 can generate zones 1, 2, and 3. As illustrated in situation 400, zone 1 includes the children, zone 2 includes the elderly woman, and zone 3 includes the adult men. When generating the zones, the UAV 101 can use the camera 204 and computer vision techniques such as edge detection to group individuals in proximity (within a radial threshold distance such as 5 feet). In some aspects, the UAV 101 also verifies that individuals with matching exception conditions are not separated by another individual who is non-aggressive or has a different exception condition. For example, if the elderly woman were between the two children, the UAV may determine that zone 1 cannot be created due to the different exception condition. In that case, three zones would be created - one for each of individuals 402, 404, and 406.
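The zone-generation step - grouping nearby individuals that share a matching exception condition - can be sketched with a simple single-pass grouping. The 5-foot radius follows the example above; the data layout and function name are hypothetical, and the sketch ignores the "separated by a different individual" refinement described in the text.

```python
# Hypothetical zone generation: cluster individuals by exception condition and proximity.
import math

RADIUS_FT = 5.0  # radial threshold distance from the example above


def generate_zones(individuals):
    """Group individuals into zones.

    `individuals` is a list of (id, (x, y), exception_condition) tuples,
    e.g., derived from image 401 by the object detector 228.
    """
    zones = []
    for ident, pos, cond in individuals:
        for zone in zones:
            # Join a zone only if the condition matches and some member is nearby.
            if zone["condition"] == cond and any(
                    math.dist(pos, p) <= RADIUS_FT for _, p in zone["members"]):
                zone["members"].append((ident, pos))
                break
        else:  # no matching nearby zone: start a new one
            zones.append({"condition": cond, "members": [(ident, pos)]})
    return zones
```

With the five individuals of situation 400 this yields three zones (children, elderly woman, adults), after which one neutralization action can be chosen per zone.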
[0081] After generating the zones, the UAV 101 selects and performs a neutralization action for each zone. For example, the UAV 101 can spray pepper spray in zone 3, emit a loud sound directed specifically at zone 1, and emit a flashing light in zone 2. [0082] FIG. 5 shows a flowchart of a method 500 for controlling a hostile individual harmlessly using a UAV, in accordance with aspects of this disclosure. At 502, the UAV 101 detects an individual in an environment. At 504, the UAV 101 assigns an aggression factor to the individual based on the individual's movement data. At 506, the UAV 101 determines whether the aggression factor is above an aggression threshold. Following the determination that the factor is not above the threshold, method 500 terminates. However, following the determination that the factor is above the threshold, method 500 proceeds to 508, where the UAV 101 prepares a neutralization action aimed at reducing the individual's aggression factor. At 510, the UAV 101 identifies an exception condition of the individual. At 512, the UAV 101 selects, from a plurality of neutralization actions that the UAV is capable of performing, a neutralization action based on the exception condition. At 514, the UAV 101 performs the neutralization action. [0084] FIG. 6 shows a flowchart of method 600 for controlling a plurality of hostile individuals, in accordance with aspects of the present disclosure. At 602, the UAV 101 captures a real-time image of an environment. At 604, the UAV 101 detects a plurality of individuals in the real-time image. At 606, the UAV 101 selects an individual from the plurality of individuals. At 608, the UAV 101 assigns an aggression factor to the individual based on the individual's movement data. At 610, the UAV 101 determines whether the aggression factor is above an aggression threshold. Following the determination that the value is not above the threshold (i.e.
no neutralization action is required for that particular individual), method 600 proceeds to 612, where the UAV 101 determines whether there are other individuals in the plurality that should be considered. For example, the plurality of individuals can include 10 individuals. The first of these individuals may not show any aggression. Upon determining that there are other individuals to consider, method 600 returns to 606. Suppose that at 610 the UAV 101 determines that the value is above the threshold. As a result, method 600 proceeds to 614, where the UAV 101 identifies an exception condition of the individual. At 616, the UAV 101 selects a neutralization action from a plurality of neutralization actions that the UAV is capable of performing. For example, the UAV 101 may be capable of performing six different actions. The UAV 101 can select the first of the six actions. At 618, the UAV 101 determines whether the neutralization action will satisfy the exception condition. If the action does not satisfy the condition, method 600 returns to 616, where the UAV 101 selects a different action (for example, the second of the six). Following the determination that the action satisfies the exception condition, at 620 the UAV 101 assigns the neutralization action to the individual. In the event that the loop between 616 and 618 reaches the last action, the UAV 101 automatically assigns that neutralization action to the individual. This last action can be harmless to any individual (such as taking a photo and sending it to the authorities, or saving the photo in memory for later access when, for example, there is signal coverage and the photo can be sent) and can therefore be a universal action (i.e., it satisfies all exception conditions). From 620, method 600 returns to 612, where the UAV 101 determines whether there are other individuals to be considered in the plurality of individuals.
If there are no other individuals, method 600 proceeds to 622, where the UAV 101 tracks the location of each individual who has been assigned a neutralization action. At 624, the UAV 101 performs, on each individual, the respective neutralization action assigned by the UAV, based on the individual's location. For example, the UAV 101 can determine the 3D coordinates of the respective individuals and then perform an action such as spraying a substance on a respective individual by aiming the spray gun at the respective individual's location. [0089] FIG. 7 shows a flowchart of method 700 for deploying a plurality of UAVs, in accordance with aspects of the present disclosure. Note that method 700 is just one example of how the control center 122 can decide the number of UAVs to deploy. For example, the control center 122 may consider other factors, or may refer to predetermined schemes as discussed in FIG. 1B. At 702, the control center 122 receives a request to deploy a plurality of UAVs at a location where hostile individuals are present. At 704, the control center 122 determines whether there are more than X people (e.g., X = 5) at the location. Following the determination that there are, method 700 proceeds to 706, where the deployment count (DC) is set to A (for example, A = X = 5). If there are not, method 700 advances to 708, where the DC is set to B (for example, B = 3), which can be less than A. From 706 and 708, method 700 proceeds to 710, where the control center 122 determines whether the weather conditions are bad (for example, it is raining or very windy). If conditions are bad, at 712 the control center 122 increases DC by C (for example, C = 3). If conditions are not bad, at 714 the control center 122 increases DC by D, which may be less than C (for example, D = 0). From 712 and 714, method 700 proceeds to 716, where the control center 122 determines whether there are at least Y law enforcement officers at the location.
Following the determination that there are, at 718 the control center 122 increases DC by E (e.g., E = 1). Upon determining that there are not, at 720 the control center 122 increases DC by F, which may be greater than E (e.g., F = 3). From 718 and 720, method 700 proceeds to 722, where the control center 122 determines whether the hostile individuals carry weapons. Following the determination that they do, at 724 the control center 122 increases DC by G (for example, G = 5). Following the determination that they do not, at 726 the control center 122 increases DC by H, which may be less than G (for example, H = 1). At 728, the control center 122 deploys the DC number of UAVs from the base 120 and/or the mobile base 124. In some aspects, the control center 122 can deploy UAVs iteratively based on the provisional DC. For example, the control center 122 may initially deploy only one UAV to scan the location. As the single UAV sends back information (e.g., weather conditions, number of hostile individuals, number of agents, number of armed individuals), the control center 122 can deploy more UAVs continuously. For example, after setting the DC to A at 706, the control center 122 may deploy A UAVs. Thereafter, at 712, the control center 122 may deploy an additional C UAVs. [0095] FIG. 8 is a block diagram illustrating a computer system 20 on which aspects of systems and methods for controlling hostile individuals in a harmless manner using remotely piloted aircraft (UAV), and of systems and methods for deploying and controlling a fleet of such UAVs, can be implemented, according to an exemplary aspect. The computer system 20 can take the form of multiple computing devices or a single computing device, such as the UAV 200. For example, the hardware 201 of the UAV 200 may include the computer system 20 for running the software 202 and integrating the different modules of the software 202 with peripheral devices such as the camera 204 and the motor 208.
In some aspects, the computer system 20 can be comprised in the UAV control center 238. [0096] As shown, the computer system 20 comprises a central processing unit (CPU) 21, a system memory 22, and a system bus 23 which connects the various components of the system, including the memory associated with the central processing unit 21. The system bus 23 can comprise a memory bus or memory bus controller, a peripheral bus, and a local bus capable of interacting with any other bus architecture. Examples of buses may include PCI, ISA, PCI-Express, HyperTransport™, InfiniBand™, Serial ATA, I²C, and other suitable interconnects. The central processing unit 21 (also called processor) may include a single processor or a set of processors with one or more cores. The processor 21 may execute one or more executable computer codes that implement the techniques of the present disclosure. For example, any command/step discussed in FIGS. 1A-7 can be executed by the processor 21. The processor 21 can be part of the hardware 201 or the hardware 240. The system memory 22 can be any memory used here for storing data and/or computer programs executable by the processor 21. The system memory 22 may include volatile memory such as random access memory (RAM) 25 and non-volatile memory such as read-only memory (ROM) 24, flash memory, etc., or a combination of these. The basic input/output system (BIOS) 26 can store basic procedures for transferring information between elements of the computer system 20, such as those used at the time of loading the operating system with the use of the ROM 24. The computer system 20 may include one or more storage devices, such as one or more removable storage devices 27, one or more non-removable storage devices 28, or a combination thereof. The one or more removable storage devices 27 and non-removable storage devices 28 are connected to the system bus 23 via a storage interface 32.
In one aspect, the storage devices and the corresponding computer storage media are power-independent modules for storing computer instructions, data structures, program modules, and other data of the computer system 20. The system memory 22, the removable storage devices 27, and the non-removable storage devices 28 can use a variety of computer storage media. Examples of computer storage media include onboard memory such as cache, SRAM, DRAM, zero-capacitor RAM, dual-transistor RAM, eDRAM, EDO RAM, DDR RAM, EEPROM, NRAM, RRAM, SONOS, and PRAM; flash memory or other memory technologies such as solid state drives (SSDs) or flash drives; magnetic cassettes, magnetic tapes, and magnetic disk storage such as, for example, in hard disk drives or floppy disks; optical storage such as, for example, in compact discs (CD-ROMs) or digital versatile discs (DVDs); and any other medium which can be used to store the desired data and which can be accessed by the computer system 20. The system memory 22, the removable storage devices 27, and the non-removable storage devices 28 of the computer system 20 can be used to store an operating system 35, additional program applications 37, other program modules 38, and program data 39. The computer system 20 may include a peripheral interface 46 for communicating data from input devices 40, such as a keyboard, mouse, stylus, game controller, voice command device, or touch device, or from other peripheral devices, such as a printer or scanner, through one or more I/O ports, such as a serial port, a parallel port, a universal serial bus (USB), or another peripheral interface. A display device 47, such as one or more monitors, projectors, or integrated displays, can also be connected to the system bus 23 through an output interface 48, such as a video adapter. In addition to the display devices 47, the computer system 20 can be equipped with other peripheral output devices (not shown), such as speakers and other audiovisual devices.
The computer system 20 may operate in a network environment using a network connection to one or more remote computers 49. The remote computer (or computers) 49 may be local workstations or servers comprising most or all of the elements mentioned above in describing the nature of the computer system 20. Other devices may also be present in the computer network, such as, but not limited to, routers, network stations, peer devices, or other network nodes. The computer system 20 may comprise one or more network interfaces 51 or network adapters for communicating with the remote computers 49 through one or more networks, such as a local area network (LAN) 50, a wide area network (WAN), an intranet, and the Internet. Examples of the network interface 51 may include an Ethernet interface, a Frame Relay interface, a SONET interface, and wireless interfaces. [0100] Aspects of this disclosure may be a system, a method, and/or a computer program product. The computer program product may include a computer storage medium (or media) with computer program instructions for causing a processor to perform aspects of this disclosure. [0101] The computer storage medium may be a tangible device capable of retaining and storing program code in the form of instructions or data structures accessible by a processor of a computing device, such as the computer system 20. The computer storage medium may be an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any appropriate combination of these.
By way of example, such a computer storage medium may comprise a random access memory (RAM), a read-only memory (ROM), an EEPROM, a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a flash memory, a hard disk, a portable disk, a memory stick, a floppy disk, or even a mechanically encoded device such as punch cards or structures embossed in a groove with recorded instructions. As used herein, a computer storage medium is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or transmission medium, or electrical signals transmitted through a wire. [0102] The computer program instructions described in this document can be downloaded to the respective computing devices from a computer storage medium, or to an external computer or external storage device via a network, such as the Internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers. A network interface of each computing device receives computer program instructions from the network and forwards the computer program instructions for storage on a computer storage medium within the respective computing device. [0103] Computer program instructions for performing the operations of this disclosure may be assembly instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including an object-oriented programming language and conventional procedural programming languages.
The computer program instructions can be executed entirely on the user's computer, partly on the user's computer, as a standalone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter case, the remote computer can be connected to the user's computer through any type of network, including a LAN or WAN, or the connection can be made to an external computer (for example, through the Internet). In some embodiments, electronic circuits including, for example, programmable logic circuits, field-programmable gate arrays (FPGAs), or programmable logic arrays (PLAs) can execute the computer program instructions, using the state information of the computer program instructions to customize the electronic circuits, in order to carry out aspects of this disclosure. [0104] In various aspects, the systems and methods described in the present disclosure can be treated in terms of modules. The term "module" used here refers to an actual device, component, or arrangement of components implemented using hardware, such as an application-specific integrated circuit (ASIC) or FPGA, or as a combination of hardware and software, such as a microprocessor system and a set of instructions to implement the module's functionality, which (while running) transforms the microprocessor system into a special-purpose device. A module can also be implemented as a combination of the two, with some functions facilitated by hardware alone and other functions facilitated by a combination of hardware and software. In some implementations, at least part, and in some cases all, of a module can run on the processor of a computer system. Accordingly, each module can be realized in a variety of suitable configurations and need not be limited to any particular implementation exemplified herein. [0105] For the sake of clarity, not all routine features of the aspects are disclosed herein.
It will be appreciated that in the development of any actual implementation of this disclosure, numerous implementation-specific decisions must be made in order to achieve the developer's specific goals, and that these specific goals will vary for different implementations and different developers. It is understood that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking for those of ordinary skill in the art having the benefit of this disclosure. [0106] Furthermore, it is to be understood that the phraseology or terminology used herein is for the purpose of description and not of limitation, such that the terminology or phraseology of the present specification is to be interpreted by those skilled in the art in light of the teachings and guidance presented herein, in combination with the knowledge of those skilled in the relevant art or arts. Moreover, no term in the specification or claims is to be ascribed an uncommon or special meaning unless explicitly set forth as such. [0107] The various aspects disclosed herein encompass present and future known equivalents of the known modules referred to herein by way of illustration. Moreover, while aspects and applications have been shown and described, it would be apparent to those skilled in the art having the benefit of this disclosure that many more modifications than those mentioned above are possible without departing from the inventive concepts set forth herein.
Claims (20)
[1] 1. A method of controlling target individuals using remotely piloted aircraft (UAVs), the method comprising:
deploying one or more UAVs at a location of an individual;
obtaining information about the individual from the one or more UAVs and from external sources;
assigning an aggression factor to the individual on the basis of the obtained information;
in response to determining that the aggression factor is greater than an aggression threshold, preparing a neutralization action aimed at reducing the aggression factor of the individual by:
identifying one or more exception conditions of the individual; and
selecting, from a plurality of neutralization actions that the one or more UAVs are capable of performing, a neutralization action based on the one or more exception conditions; and
instructing the one or more UAVs to perform the selected neutralization action on the individual.
[2] 2. The method of claim 1, wherein identifying the one or more exception conditions of the individual comprises:
acquiring an image of the individual's face;
determining an identifier of the individual by performing facial recognition on the face image;
generating a VIP white-list database from the external sources, wherein neutralization actions cannot be performed on VIPs; and
determining that the identifier is in the white-list database, wherein the one or more exception conditions indicate that the individual is a VIP.
[3] 3. The method of claim 1, wherein identifying the one or more exception conditions of the individual comprises:
acquiring an image of the individual's face;
determining an identifier of the individual by performing facial recognition on the face image;
generating a black-list database of dangerous individuals from the external sources; and
determining that the identifier is in the black-list database, wherein the one or more exception conditions indicate that the individual is a dangerous individual.
[4] 4.
The method of claim 1, wherein identifying the one or more exception conditions of the individual comprises:
acquiring an image of the individual's face;
determining an identifier of the individual by performing facial recognition on the face image;
searching, in a medical database, for the medical records of the individual using the identifier; and
identifying the one or more exception conditions in the individual's medical records.
[5] 5. The method of claim 1, wherein identifying the exception condition of the individual comprises:
acquiring an image of the individual's face;
identifying, using object recognition, physical attributes of the individual; and
predicting the exception condition based on the physical attributes of the individual.
[6] 6. The method of claim 1, wherein the individual is part of a plurality of target individuals at the location, further comprising:
detecting at least one other individual in the plurality of target individuals;
determining that an aggression factor of the at least one other individual is above the aggression threshold;
identifying another exception condition of the at least one other individual;
selecting, from the plurality of neutralization actions, another neutralization action for the at least one other individual on the basis of the other exception condition; and
instructing the one or more UAVs to perform the other neutralization action concurrently with the neutralization action, wherein the neutralization action is different from the other neutralization action.
[7] 7. The method of claim 6, wherein the neutralization action comprises spraying a first substance on the individual by means of a first spray gun of the UAV, and the other neutralization action comprises spraying a second substance on the at least one other individual by means of a second spray gun of the UAV, wherein the second substance is different from the first substance.
[8] 8. The method of claim 1, wherein the neutralization action comprises spraying a substance on the individual, further comprising:
determining, by means of a depth sensor, a position of the individual and a distance between the individual and the UAV;
calculating a projection vector indicating the direction in which to spray the substance;
determining a concentration of the substance that is safe for the individual based on the exception condition;
adjusting the distance between the individual and the UAV to achieve the concentration; and
spraying the substance along the projection vector such that the individual is exposed to the determined concentration.
[9] 9. The method of claim 8, wherein calculating the projection vector further comprises:
predicting a position of the individual based on the obtained information about the individual, wherein the obtained information includes movement data; and
setting the projection vector along the UAV position and the predicted position.
[10] 10. The method of claim 1, wherein the exception condition comprises at least one of:
(1) an age,
(2) a disability,
(3) an allergy,
(4) a pathology,
(5) a pregnancy,
(6) a use of medical devices,
(7) a VIP status,
(8) a criminal record, and
(9) possession of a weapon.
[11] 11. The method of claim 1, further comprising:
after performing the neutralization action, determining whether the aggression factor of the individual has dropped below the aggression threshold based on movement data over a period of time; and
in response to determining that the aggression factor has not dropped below the aggression threshold, performing a second neutralization action of the plurality of neutralization actions.
[12] 12. The method of claim 1, wherein the neutralization action is one of:
(1) spraying a substance on the individual,
(2) emitting a sound,
(3) shining lights on the individual,
(4) sending a picture of the individual to law enforcement, and
(5) calling a law enforcement officer.
[13] 13.
A system for controlling target individuals using remotely piloted aircraft (UAVs), the system comprising:
at least one processor of a control center configured to:
deploy one or more UAVs at a location of an individual;
obtain information about the individual from the one or more UAVs and from external sources;
assign an aggression factor to the individual on the basis of the obtained information;
in response to determining that the aggression factor is above an aggression threshold, prepare a neutralization action aimed at reducing the aggression factor of the individual by:
identifying one or more exception conditions of the individual; and
selecting, from a plurality of neutralization actions that the one or more UAVs are capable of performing, a neutralization action based on the one or more exception conditions; and
instruct the one or more UAVs to perform the selected neutralization action on the individual.
[14] 14. The system of claim 13, wherein, to identify the one or more exception conditions of the individual, the at least one processor is configured to:
acquire an image of the individual's face;
determine an identifier of the individual by performing facial recognition on the face image;
generate a VIP white-list database from the external sources, wherein neutralization actions cannot be performed on VIPs; and
determine that the identifier is in the white-list database, wherein the one or more exception conditions indicate that the individual is a VIP.
[15] 15. The system of claim 13, wherein, to identify the one or more exception conditions of the individual, the at least one processor is configured to:
acquire an image of the individual's face;
determine an identifier of the individual by performing facial recognition on the face image;
generate a black-list database of dangerous individuals from the external sources; and
determine that the identifier is in the black-list database, wherein the one or more exception conditions indicate that the individual is a dangerous individual.
[16] 16. The system of claim 13, wherein, to identify the one or more exception conditions of the individual, the at least one processor is configured to:
acquire an image of the individual's face;
determine an identifier of the individual by performing facial recognition on the face image;
search, in a medical database, for the medical records of the individual using the identifier; and
identify the one or more exception conditions in the individual's medical records.
[17] 17. The system of claim 13, wherein, to identify the exception condition of the individual, the at least one processor is configured to:
acquire an image of the individual's face;
identify, using object recognition, physical attributes of the individual; and
predict the exception condition based on the physical attributes of the individual.
[18] 18. The system of claim 13, wherein the individual is part of a plurality of target individuals at the location, and wherein the at least one processor is further configured to:
detect at least one other individual in the plurality of target individuals;
determine that an aggression factor of the at least one other individual is above the aggression threshold;
identify another exception condition of the at least one other individual;
select, from the plurality of neutralization actions, another neutralization action for the at least one other individual on the basis of the other exception condition; and
instruct the one or more UAVs to perform the other neutralization action concurrently with the neutralization action, wherein the neutralization action is different from the other neutralization action.
[19] 19. The system of claim 18, wherein the neutralization action comprises spraying a first substance on the individual by means of a first spray gun of the UAV, and the other neutralization action comprises spraying a second substance on the at least one other individual by means of a second spray gun of the UAV, wherein the second substance is different from the first substance.
[20] 20.
A non-transitory computer-readable medium storing executable instructions for harmlessly controlling target individuals using remotely piloted aircraft (UAVs), including instructions for:
deploying one or more UAVs at a location of an individual;
obtaining information about the individual from the one or more UAVs and from external sources;
assigning an aggression factor to the individual on the basis of the obtained information;
in response to determining that the aggression factor is above an aggression threshold, preparing a neutralization action aimed at reducing the aggression factor of the individual by:
identifying one or more exception conditions of the individual; and
selecting, from a plurality of neutralization actions that the one or more UAVs are capable of performing, a neutralization action based on the one or more exception conditions; and
instructing the one or more UAVs to perform the selected neutralization action on the individual.
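For illustration only, the exception-guided selection recited in claims 1, 2, 10 and 12 can be sketched as a small filtering routine. This is a minimal sketch under stated assumptions, not the claimed implementation: the condition-to-action mapping, the threshold value, the action names, and the "least invasive first" ordering are all hypothetical and do not appear in the claims.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical mapping of exception conditions (cf. claim 10) to the
# neutralization actions (cf. claim 12) still considered safe for them.
SAFE_ACTIONS = {
    "allergy":   {"sound", "lights", "notify_law_enforcement"},
    "pacemaker": {"sound", "lights", "notify_law_enforcement"},
    "vip":       set(),  # cf. claim 2: no action may be performed on a VIP
}
ALL_ACTIONS = {"spray", "sound", "lights", "notify_law_enforcement"}
AGGRESSION_THRESHOLD = 0.7  # illustrative value only

@dataclass
class Individual:
    aggression_factor: float
    exception_conditions: List[str] = field(default_factory=list)

def select_neutralization_action(individual: Individual) -> Optional[str]:
    """Return a neutralization action permitted by the individual's
    exception conditions, or None if no action should be taken."""
    if individual.aggression_factor <= AGGRESSION_THRESHOLD:
        return None  # cf. claim 1: act only above the aggression threshold
    # Restrict the full action set by every identified exception condition.
    allowed = set(ALL_ACTIONS)
    for condition in individual.exception_conditions:
        allowed &= SAFE_ACTIONS.get(condition, ALL_ACTIONS)
    if not allowed:
        return None  # e.g. the individual is on the VIP white list
    # Prefer the least invasive permitted action (illustrative ordering).
    for action in ("sound", "lights", "notify_law_enforcement", "spray"):
        if action in allowed:
            return action
    return None
```

Under these assumptions, an above-threshold individual with a recorded allergy would receive a sound or light action rather than a spray, while a white-listed VIP would trigger no action at all.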
Family patents:
Publication number | Publication date
US20210300549A1 | 2021-09-30
Priority:
Application number | Filing date | Patent title
US16/833,538 | 2020-03-28 | Systems and methods for subduing target individuals using unmanned aerial vehicles